Label distributions:

📌 Retain class distribution for seed 7:
Class 0: 4500
Class 1: 4500
Class 2: 4500
Class 3: 4500
Class 4: 4500
Class 5: 4500
Class 6: 4500
Class 7: 4500
Class 8: 4500
Class 9: 4500

📌 Forget class distribution for seed 7:
Class 0: 500
Class 1: 500
Class 2: 500
Class 3: 500
Class 4: 500
Class 5: 500
Class 6: 500
Class 7: 500
Class 8: 500
Class 9: 500
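The seeded split above keeps the CIFAR-10 class balance exact: each of the 10 classes contributes 4500 images to the retain set and 500 (10%) to the forget set. A minimal sketch of such a seeded, per-class split (hypothetical helper name; the actual script may differ):

```python
# Minimal sketch of a seeded, class-balanced retain/forget split
# (hypothetical helper; the script's actual function may differ).
import numpy as np

def split_retain_forget(labels, forget_per_class=500, seed=7):
    """Return index arrays for the retain and forget sets.

    For each class, `forget_per_class` indices are sampled without
    replacement into the forget set; the rest are retained.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    retain_idx, forget_idx = [], []
    for c in np.unique(labels):
        cls_idx = np.where(labels == c)[0]
        rng.shuffle(cls_idx)
        forget_idx.append(cls_idx[:forget_per_class])
        retain_idx.append(cls_idx[forget_per_class:])
    return np.concatenate(retain_idx), np.concatenate(forget_idx)
```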

📊 Updated class distribution:
Retain set:
  Class 0: 4625
  Class 1: 4625
  Class 2: 4625
  Class 3: 4625
  Class 4: 4625
  Class 5: 4625
  Class 6: 4625
  Class 7: 4625
  Class 8: 4625
  Class 9: 4625
Forget set:
  Class 0: 375
  Class 1: 375
  Class 2: 375
  Class 3: 375
  Class 4: 375
  Class 5: 375
  Class 6: 375
  Class 7: 375
  Class 8: 375
  Class 9: 375
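The updated counts match moving 25% of each class's forget samples back into retain (125 of 500 per class, i.e. 4500 + 125 = 4625 retained and 375 forgotten), which is consistent with the `ret25` tag in the checkpoint filename further below. A sketch under that assumption, reusing the index arrays from the previous snippet:

```python
# Return a fraction of each class's forget indices to the retain set
# (sketch; frac=0.25 inferred from the 'ret25' tag and the 500 -> 375 counts).
import numpy as np

def return_fraction(retain_idx, forget_idx, labels, frac=0.25, seed=7):
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    kept_forget, moved_back = [], []
    for c in np.unique(labels[forget_idx]):
        cls = forget_idx[labels[forget_idx] == c]
        k = int(round(len(cls) * frac))      # 125 of 500 per class
        cls = rng.permutation(cls)
        moved_back.append(cls[:k])
        kept_forget.append(cls[k:])
    return (np.concatenate([retain_idx] + moved_back),
            np.concatenate(kept_forget))
```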
⚠️ Warning: Retain train loader may not be shuffled.
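The batch counter below advances in steps of 256, so the retain loader evidently uses batch_size=256. The warning above can be addressed by constructing the loader with shuffling enabled (standard PyTorch API; the dataset variable name is an assumption):

```python
# Shuffled retain loader; batch size 256 matches the step counter in the log.
from torch.utils.data import DataLoader

retain_train_loader = DataLoader(retain_train_set, batch_size=256,
                                 shuffle=True, num_workers=4, pin_memory=True)
```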
Training Epoch: 1 [256/46250]	Loss: 2.4606	LR: 0.000000
Training Epoch: 1 [512/46250]	Loss: 2.4136	LR: 0.000552
Training Epoch: 1 [768/46250]	Loss: 2.3620	LR: 0.001105
Training Epoch: 1 [1024/46250]	Loss: 2.3678	LR: 0.001657
Training Epoch: 1 [1280/46250]	Loss: 2.3527	LR: 0.002210
Training Epoch: 1 [1536/46250]	Loss: 2.2408	LR: 0.002762
Training Epoch: 1 [1792/46250]	Loss: 2.2548	LR: 0.003315
Training Epoch: 1 [2048/46250]	Loss: 2.2523	LR: 0.003867
Training Epoch: 1 [2304/46250]	Loss: 2.2225	LR: 0.004420
Training Epoch: 1 [2560/46250]	Loss: 2.3032	LR: 0.004972
Training Epoch: 1 [2816/46250]	Loss: 2.2813	LR: 0.005525
Training Epoch: 1 [3072/46250]	Loss: 2.3069	LR: 0.006077
Training Epoch: 1 [3328/46250]	Loss: 2.0661	LR: 0.006630
Training Epoch: 1 [3584/46250]	Loss: 2.0775	LR: 0.007182
Training Epoch: 1 [3840/46250]	Loss: 1.9641	LR: 0.007735
Training Epoch: 1 [4096/46250]	Loss: 2.0164	LR: 0.008287
Training Epoch: 1 [4352/46250]	Loss: 1.9508	LR: 0.008840
Training Epoch: 1 [4608/46250]	Loss: 2.0297	LR: 0.009392
Training Epoch: 1 [4864/46250]	Loss: 1.9416	LR: 0.009945
Training Epoch: 1 [5120/46250]	Loss: 1.8928	LR: 0.010497
Training Epoch: 1 [5376/46250]	Loss: 1.9192	LR: 0.011050
Training Epoch: 1 [5632/46250]	Loss: 1.9511	LR: 0.011602
Training Epoch: 1 [5888/46250]	Loss: 1.7842	LR: 0.012155
Training Epoch: 1 [6144/46250]	Loss: 1.7108	LR: 0.012707
Training Epoch: 1 [6400/46250]	Loss: 1.7892	LR: 0.013260
Training Epoch: 1 [6656/46250]	Loss: 1.8125	LR: 0.013812
Training Epoch: 1 [6912/46250]	Loss: 1.7950	LR: 0.014365
Training Epoch: 1 [7168/46250]	Loss: 1.8804	LR: 0.014917
Training Epoch: 1 [7424/46250]	Loss: 1.7437	LR: 0.015470
Training Epoch: 1 [7680/46250]	Loss: 1.9368	LR: 0.016022
Training Epoch: 1 [7936/46250]	Loss: 1.7934	LR: 0.016575
Training Epoch: 1 [8192/46250]	Loss: 1.8350	LR: 0.017127
Training Epoch: 1 [8448/46250]	Loss: 1.7036	LR: 0.017680
Training Epoch: 1 [8704/46250]	Loss: 1.7496	LR: 0.018232
Training Epoch: 1 [8960/46250]	Loss: 1.5734	LR: 0.018785
Training Epoch: 1 [9216/46250]	Loss: 1.7776	LR: 0.019337
Training Epoch: 1 [9472/46250]	Loss: 1.8503	LR: 0.019890
Training Epoch: 1 [9728/46250]	Loss: 1.7043	LR: 0.020442
Training Epoch: 1 [9984/46250]	Loss: 1.7312	LR: 0.020994
Training Epoch: 1 [10240/46250]	Loss: 1.7327	LR: 0.021547
Training Epoch: 1 [10496/46250]	Loss: 1.8481	LR: 0.022099
Training Epoch: 1 [10752/46250]	Loss: 1.5354	LR: 0.022652
Training Epoch: 1 [11008/46250]	Loss: 1.7323	LR: 0.023204
Training Epoch: 1 [11264/46250]	Loss: 1.7302	LR: 0.023757
Training Epoch: 1 [11520/46250]	Loss: 1.8400	LR: 0.024309
Training Epoch: 1 [11776/46250]	Loss: 1.7081	LR: 0.024862
Training Epoch: 1 [12032/46250]	Loss: 1.6973	LR: 0.025414
Training Epoch: 1 [12288/46250]	Loss: 1.6909	LR: 0.025967
Training Epoch: 1 [12544/46250]	Loss: 1.6121	LR: 0.026519
Training Epoch: 1 [12800/46250]	Loss: 1.5569	LR: 0.027072
Training Epoch: 1 [13056/46250]	Loss: 1.6175	LR: 0.027624
Training Epoch: 1 [13312/46250]	Loss: 1.7018	LR: 0.028177
Training Epoch: 1 [13568/46250]	Loss: 1.5636	LR: 0.028729
Training Epoch: 1 [13824/46250]	Loss: 1.6587	LR: 0.029282
Training Epoch: 1 [14080/46250]	Loss: 1.6918	LR: 0.029834
Training Epoch: 1 [14336/46250]	Loss: 1.4669	LR: 0.030387
Training Epoch: 1 [14592/46250]	Loss: 1.5699	LR: 0.030939
Training Epoch: 1 [14848/46250]	Loss: 1.5934	LR: 0.031492
Training Epoch: 1 [15104/46250]	Loss: 1.5020	LR: 0.032044
Training Epoch: 1 [15360/46250]	Loss: 1.5595	LR: 0.032597
Training Epoch: 1 [15616/46250]	Loss: 1.5362	LR: 0.033149
Training Epoch: 1 [15872/46250]	Loss: 1.6021	LR: 0.033702
Training Epoch: 1 [16128/46250]	Loss: 1.6423	LR: 0.034254
Training Epoch: 1 [16384/46250]	Loss: 1.5248	LR: 0.034807
Training Epoch: 1 [16640/46250]	Loss: 1.4726	LR: 0.035359
Training Epoch: 1 [16896/46250]	Loss: 1.5362	LR: 0.035912
Training Epoch: 1 [17152/46250]	Loss: 1.6723	LR: 0.036464
Training Epoch: 1 [17408/46250]	Loss: 1.4485	LR: 0.037017
Training Epoch: 1 [17664/46250]	Loss: 1.5433	LR: 0.037569
Training Epoch: 1 [17920/46250]	Loss: 1.6224	LR: 0.038122
Training Epoch: 1 [18176/46250]	Loss: 1.4739	LR: 0.038674
Training Epoch: 1 [18432/46250]	Loss: 1.6572	LR: 0.039227
Training Epoch: 1 [18688/46250]	Loss: 1.8277	LR: 0.039779
Training Epoch: 1 [18944/46250]	Loss: 1.6102	LR: 0.040331
Training Epoch: 1 [19200/46250]	Loss: 1.6387	LR: 0.040884
Training Epoch: 1 [19456/46250]	Loss: 1.6372	LR: 0.041436
Training Epoch: 1 [19712/46250]	Loss: 1.5279	LR: 0.041989
Training Epoch: 1 [19968/46250]	Loss: 1.6449	LR: 0.042541
Training Epoch: 1 [20224/46250]	Loss: 1.5658	LR: 0.043094
Training Epoch: 1 [20480/46250]	Loss: 1.6878	LR: 0.043646
Training Epoch: 1 [20736/46250]	Loss: 1.4728	LR: 0.044199
Training Epoch: 1 [20992/46250]	Loss: 1.5167	LR: 0.044751
Training Epoch: 1 [21248/46250]	Loss: 1.6033	LR: 0.045304
Training Epoch: 1 [21504/46250]	Loss: 1.4424	LR: 0.045856
Training Epoch: 1 [21760/46250]	Loss: 1.4360	LR: 0.046409
Training Epoch: 1 [22016/46250]	Loss: 1.4766	LR: 0.046961
Training Epoch: 1 [22272/46250]	Loss: 1.3967	LR: 0.047514
Training Epoch: 1 [22528/46250]	Loss: 1.4167	LR: 0.048066
Training Epoch: 1 [22784/46250]	Loss: 1.4794	LR: 0.048619
Training Epoch: 1 [23040/46250]	Loss: 1.4795	LR: 0.049171
Training Epoch: 1 [23296/46250]	Loss: 1.5082	LR: 0.049724
Training Epoch: 1 [23552/46250]	Loss: 1.4192	LR: 0.050276
Training Epoch: 1 [23808/46250]	Loss: 1.3623	LR: 0.050829
Training Epoch: 1 [24064/46250]	Loss: 1.4181	LR: 0.051381
Training Epoch: 1 [24320/46250]	Loss: 1.5234	LR: 0.051934
Training Epoch: 1 [24576/46250]	Loss: 1.6145	LR: 0.052486
Training Epoch: 1 [24832/46250]	Loss: 1.6240	LR: 0.053039
Training Epoch: 1 [25088/46250]	Loss: 1.2792	LR: 0.053591
Training Epoch: 1 [25344/46250]	Loss: 1.4143	LR: 0.054144
Training Epoch: 1 [25600/46250]	Loss: 1.4787	LR: 0.054696
Training Epoch: 1 [25856/46250]	Loss: 1.4806	LR: 0.055249
Training Epoch: 1 [26112/46250]	Loss: 1.5677	LR: 0.055801
Training Epoch: 1 [26368/46250]	Loss: 1.4534	LR: 0.056354
Training Epoch: 1 [26624/46250]	Loss: 1.4018	LR: 0.056906
Training Epoch: 1 [26880/46250]	Loss: 1.5802	LR: 0.057459
Training Epoch: 1 [27136/46250]	Loss: 1.4918	LR: 0.058011
Training Epoch: 1 [27392/46250]	Loss: 1.5343	LR: 0.058564
Training Epoch: 1 [27648/46250]	Loss: 1.4341	LR: 0.059116
Training Epoch: 1 [27904/46250]	Loss: 1.5459	LR: 0.059669
Training Epoch: 1 [28160/46250]	Loss: 1.4275	LR: 0.060221
Training Epoch: 1 [28416/46250]	Loss: 1.2841	LR: 0.060773
Training Epoch: 1 [28672/46250]	Loss: 1.3149	LR: 0.061326
Training Epoch: 1 [28928/46250]	Loss: 1.3923	LR: 0.061878
Training Epoch: 1 [29184/46250]	Loss: 1.5811	LR: 0.062431
Training Epoch: 1 [29440/46250]	Loss: 1.4778	LR: 0.062983
Training Epoch: 1 [29696/46250]	Loss: 1.4069	LR: 0.063536
Training Epoch: 1 [29952/46250]	Loss: 1.5618	LR: 0.064088
Training Epoch: 1 [30208/46250]	Loss: 1.3085	LR: 0.064641
Training Epoch: 1 [30464/46250]	Loss: 1.3518	LR: 0.065193
Training Epoch: 1 [30720/46250]	Loss: 1.5616	LR: 0.065746
Training Epoch: 1 [30976/46250]	Loss: 1.3349	LR: 0.066298
Training Epoch: 1 [31232/46250]	Loss: 1.4353	LR: 0.066851
Training Epoch: 1 [31488/46250]	Loss: 1.2687	LR: 0.067403
Training Epoch: 1 [31744/46250]	Loss: 1.5487	LR: 0.067956
Training Epoch: 1 [32000/46250]	Loss: 1.4756	LR: 0.068508
Training Epoch: 1 [32256/46250]	Loss: 1.3923	LR: 0.069061
Training Epoch: 1 [32512/46250]	Loss: 1.4931	LR: 0.069613
Training Epoch: 1 [32768/46250]	Loss: 1.3549	LR: 0.070166
Training Epoch: 1 [33024/46250]	Loss: 1.4407	LR: 0.070718
Training Epoch: 1 [33280/46250]	Loss: 1.4381	LR: 0.071271
Training Epoch: 1 [33536/46250]	Loss: 1.3972	LR: 0.071823
Training Epoch: 1 [33792/46250]	Loss: 1.2391	LR: 0.072376
Training Epoch: 1 [34048/46250]	Loss: 1.3613	LR: 0.072928
Training Epoch: 1 [34304/46250]	Loss: 1.1853	LR: 0.073481
Training Epoch: 1 [34560/46250]	Loss: 1.2594	LR: 0.074033
Training Epoch: 1 [34816/46250]	Loss: 1.3536	LR: 0.074586
Training Epoch: 1 [35072/46250]	Loss: 1.1780	LR: 0.075138
Training Epoch: 1 [35328/46250]	Loss: 1.2889	LR: 0.075691
Training Epoch: 1 [35584/46250]	Loss: 1.2538	LR: 0.076243
Training Epoch: 1 [35840/46250]	Loss: 1.2712	LR: 0.076796
Training Epoch: 1 [36096/46250]	Loss: 1.2452	LR: 0.077348
Training Epoch: 1 [36352/46250]	Loss: 1.5398	LR: 0.077901
Training Epoch: 1 [36608/46250]	Loss: 1.2203	LR: 0.078453
Training Epoch: 1 [36864/46250]	Loss: 1.4014	LR: 0.079006
Training Epoch: 1 [37120/46250]	Loss: 1.2833	LR: 0.079558
Training Epoch: 1 [37376/46250]	Loss: 1.3218	LR: 0.080110
Training Epoch: 1 [37632/46250]	Loss: 1.5736	LR: 0.080663
Training Epoch: 1 [37888/46250]	Loss: 1.3092	LR: 0.081215
Training Epoch: 1 [38144/46250]	Loss: 1.3080	LR: 0.081768
Training Epoch: 1 [38400/46250]	Loss: 1.4644	LR: 0.082320
Training Epoch: 1 [38656/46250]	Loss: 1.4445	LR: 0.082873
Training Epoch: 1 [38912/46250]	Loss: 1.2435	LR: 0.083425
Training Epoch: 1 [39168/46250]	Loss: 1.5536	LR: 0.083978
Training Epoch: 1 [39424/46250]	Loss: 1.0703	LR: 0.084530
Training Epoch: 1 [39680/46250]	Loss: 1.6578	LR: 0.085083
Training Epoch: 1 [39936/46250]	Loss: 1.7260	LR: 0.085635
Training Epoch: 1 [40192/46250]	Loss: 1.4872	LR: 0.086188
Training Epoch: 1 [40448/46250]	Loss: 1.5244	LR: 0.086740
Training Epoch: 1 [40704/46250]	Loss: 1.4997	LR: 0.087293
Training Epoch: 1 [40960/46250]	Loss: 1.5024	LR: 0.087845
Training Epoch: 1 [41216/46250]	Loss: 1.5008	LR: 0.088398
Training Epoch: 1 [41472/46250]	Loss: 1.4306	LR: 0.088950
Training Epoch: 1 [41728/46250]	Loss: 1.4222	LR: 0.089503
Training Epoch: 1 [41984/46250]	Loss: 1.4695	LR: 0.090055
Training Epoch: 1 [42240/46250]	Loss: 1.2311	LR: 0.090608
Training Epoch: 1 [42496/46250]	Loss: 1.2396	LR: 0.091160
Training Epoch: 1 [42752/46250]	Loss: 1.3517	LR: 0.091713
Training Epoch: 1 [43008/46250]	Loss: 1.4590	LR: 0.092265
Training Epoch: 1 [43264/46250]	Loss: 1.3309	LR: 0.092818
Training Epoch: 1 [43520/46250]	Loss: 1.3034	LR: 0.093370
Training Epoch: 1 [43776/46250]	Loss: 1.3684	LR: 0.093923
Training Epoch: 1 [44032/46250]	Loss: 1.2821	LR: 0.094475
Training Epoch: 1 [44288/46250]	Loss: 1.2583	LR: 0.095028
Training Epoch: 1 [44544/46250]	Loss: 1.3653	LR: 0.095580
Training Epoch: 1 [44800/46250]	Loss: 1.2673	LR: 0.096133
Training Epoch: 1 [45056/46250]	Loss: 1.1194	LR: 0.096685
Training Epoch: 1 [45312/46250]	Loss: 1.1355	LR: 0.097238
Training Epoch: 1 [45568/46250]	Loss: 1.1738	LR: 0.097790
Training Epoch: 1 [45824/46250]	Loss: 1.0886	LR: 0.098343
Training Epoch: 1 [46080/46250]	Loss: 1.2327	LR: 0.098895
Training Epoch: 1 [46250/46250]	Loss: 1.3285	LR: 0.099448
Epoch 1 - Average Train Loss: 1.5778, Train Accuracy: 0.4283
Epoch 1 training time consumed: 18.03s
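The LR column climbs linearly from 0 in steps of ≈0.000552 = 0.1/181, i.e. a per-batch linear warmup toward a base LR of 0.1 spread over the epoch's ceil(46250/256) = 181 batches. A minimal sketch of such a schedule (base LR inferred from the log; optimizer hyperparameters are assumptions):

```python
# Per-batch linear warmup to the base LR over one epoch
# (momentum/weight decay are assumed; only the linear ramp is visible
# in the log, reaching ~0.0994 at the last batch of epoch 1).
import torch
import torchvision

model = torchvision.models.resnet18(num_classes=10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=5e-4)
steps_per_epoch = 181  # ceil(46250 / 256)
warmup = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: min(1.0, step / steps_per_epoch))
# call warmup.step() after every optimizer.step() during epoch 1
```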
Evaluating Network.....
Test set: Epoch: 1, Average loss: 0.0082, Accuracy: 0.4256, Time consumed: 0.93s
Saving weights file to checkpoint/retrain/ResNet18/Saturday_02_August_2025_12h_22m_58s/ResNet18-Cifar10-seed7-ret25-1-best.pth
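The small 0.0082 figure is consistent with summing batch-mean losses and dividing by the 10,000 test images rather than by the batch count (0.0082 × 256 ≈ 2.1 nats per sample, plausible at 42.6% accuracy). A conventional evaluation sketch for comparison (the script's exact reductions may differ):

```python
# Conventional evaluation pass (per-sample average loss and accuracy).
# The log's 0.0082 instead looks like summed batch-mean losses divided
# by the dataset size (0.0082 * 256 ~ 2.1 per sample).
import torch
import torch.nn.functional as F

@torch.no_grad()
def evaluate(model, loader, device="cuda"):
    model.eval()
    loss_sum, correct = 0.0, 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        logits = model(x)
        loss_sum += F.cross_entropy(logits, y, reduction="sum").item()
        correct += (logits.argmax(dim=1) == y).sum().item()
    n = len(loader.dataset)
    return loss_sum / n, correct / n
```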
Valid (Test) Dl:  10000
Train Dl:  50000
Retain Train Dl:  46250
Forget Train Dl:  3750
Retain Valid Dl:  46250
Forget Valid Dl:  3750
retain_prob Distribution: 10000 samples
test_prob Distribution: 10000 samples
forget_prob Distribution: 3750 samples
Set1 Distribution: 3750 samples
Set2 Distribution: 3750 samples
Set1 Distribution: 3750 samples
Set2 Distribution: 3750 samples
Set1 Distribution: 10000 samples
Set2 Distribution: 10000 samples
Set1 Distribution: 10000 samples
Set2 Distribution: 10000 samples
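Each Set1/Set2 pair above is a matched pair of per-sample output distributions fed to the divergence and attack metrics below: forget vs retain (3750 each), forget vs test, test vs retain, and train vs test (10000 each). The ZRF score reported below is commonly defined as one minus the mean Jensen-Shannon divergence between the unlearned model's and a randomly initialized model's softmax outputs on the forget set; a sketch under that assumption (the script's exact definition may differ):

```python
# Sketch of a common ZRF formulation: 1 - mean JS divergence between the
# unlearned model and a randomly initialized model on the forget set.
import torch
import torch.nn.functional as F

def js_div(p, q, eps=1e-8):
    m = 0.5 * (p + q)
    kl = lambda a, b: (a * (a.add(eps).log() - b.add(eps).log())).sum(1)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

@torch.no_grad()
def zrf(model, random_model, forget_loader, device="cuda"):
    model.eval(); random_model.eval()
    scores = []
    for x, _ in forget_loader:
        x = x.to(device)
        p = F.softmax(model(x), dim=1)
        q = F.softmax(random_model(x), dim=1)
        scores.append(js_div(p, q))
    return 1.0 - torch.cat(scores).mean().item()
```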
Test Accuracy: 42.734375
Retain Accuracy: 42.73447799682617
Zero-Retain Forget (ZRF): 0.8884470462799072
Membership Inference Attack (MIA): 0.46826666666666666
Forget vs Retain Membership Inference Attack (MIA): 0.56
Forget vs Test Membership Inference Attack (MIA): 0.488
Test vs Retain Membership Inference Attack (MIA): 0.513
Train vs Test Membership Inference Attack (MIA): 0.50475
Forget Set Accuracy (Df): 43.82718276977539
Method Execution Time: 912.47 seconds
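The MIA figures sit near 0.5, i.e. close to chance, which is the desired outcome for a retrained model: the attacker cannot reliably tell forget samples apart from non-members. A typical loss-based attack fits a binary classifier on member vs non-member per-sample losses and reports its accuracy; a minimal sketch assuming scikit-learn (the script's exact features and splits may differ):

```python
# Sketch of a loss-based membership inference attack (assumed setup:
# a logistic-regression attacker on per-sample losses).
import numpy as np
from sklearn.linear_model import LogisticRegression

def mia_score(member_losses, nonmember_losses, eval_losses, eval_labels):
    """Fit member-vs-nonmember on losses; return accuracy on an eval split."""
    X = np.concatenate([member_losses, nonmember_losses]).reshape(-1, 1)
    y = np.concatenate([np.ones_like(member_losses),
                        np.zeros_like(nonmember_losses)])
    clf = LogisticRegression().fit(X, y)
    return clf.score(np.asarray(eval_losses).reshape(-1, 1), eval_labels)
```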
